
    Fast computation of the performance evaluation of biometric systems: application to multibiometric

    The performance evaluation of biometric systems is a crucial step when designing and evaluating such systems. The evaluation process uses the Equal Error Rate (EER) metric proposed by the International Organization for Standardization (ISO/IEC). The EER is a powerful metric that allows biometric systems to be easily compared and evaluated. However, computing the EER is usually very time-intensive. In this paper, we propose a fast method that computes an approximated value of the EER. We illustrate the benefit of the proposed method on two applications: the computation of non-parametric confidence intervals and the use of genetic algorithms to compute the parameters of fusion functions. Experimental results show the superiority of the proposed EER approximation method in terms of computing time, and its usefulness in reducing the time needed to learn parameters with genetic algorithms. The proposed method opens new perspectives for the development of secure multibiometric systems by speeding up their computation time. Comment: Future Generation Computer Systems (2012)
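
    As a rough illustration of the EER metric discussed above, the following is a minimal Python sketch that estimates the EER by scanning decision thresholds over synthetic genuine and impostor score distributions; it does not reproduce the fast approximation proposed in the paper, and all score values are hypothetical.

```python
import numpy as np

def estimate_eer(genuine_scores, impostor_scores, n_thresholds=1000):
    """Estimate the Equal Error Rate by scanning decision thresholds.

    Higher scores are assumed to indicate a better match; the EER is the
    operating point where the False Acceptance Rate (FAR) equals the
    False Rejection Rate (FRR).
    """
    genuine = np.asarray(genuine_scores)
    impostor = np.asarray(impostor_scores)
    thresholds = np.linspace(min(genuine.min(), impostor.min()),
                             max(genuine.max(), impostor.max()),
                             n_thresholds)
    far = np.array([(impostor >= t).mean() for t in thresholds])  # impostors accepted
    frr = np.array([(genuine < t).mean() for t in thresholds])    # genuine users rejected
    i = np.argmin(np.abs(far - frr))                              # closest crossing point
    return (far[i] + frr[i]) / 2.0

# Toy example with synthetic score distributions.
rng = np.random.default_rng(0)
eer = estimate_eer(rng.normal(0.7, 0.1, 5000), rng.normal(0.4, 0.1, 5000))
print(f"approximate EER: {eer:.3f}")
```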

    Web-Based Benchmark for Keystroke Dynamics Biometric Systems: A Statistical Analysis

    Most keystroke dynamics studies have been evaluated using a specific kind of dataset in which users type an imposed login and password. Moreover, these studies are optimistic since most of them use different acquisition protocols, private datasets, controlled environments, etc. In order to better assess the performance of keystroke dynamics, the main contribution of this paper is twofold. First, we provide a new kind of dataset in which users have typed both an imposed and a chosen pair of login and password. In addition, the keystroke dynamics samples are collected in a web-based, uncontrolled environment (OS, keyboards, browsers, etc.). Such a dataset is important since it provides more realistic results on the performance of keystroke dynamics than those reported in the literature (controlled environments, etc.). Second, we present a statistical analysis of well-known assertions, such as the relationship between performance and password size, the impact of fusion schemes on overall system performance, and others such as the relationship between performance and entropy. We highlight in this paper some new results on keystroke dynamics under realistic conditions. Comment: The Eighth International Conference on Intelligent Information Hiding and Multimedia Signal Processing (IIHMSP 2012), Piraeus, Greece (2012)
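
    The abstract mentions the impact of fusion schemes on overall system performance; the sketch below illustrates one common scheme, min-max normalization followed by a weighted sum rule, applied to hypothetical scores from two keystroke matchers. It is an assumed example only, not the fusion schemes actually analyzed in the paper.

```python
import numpy as np

def min_max_normalize(scores):
    """Map raw matcher scores to [0, 1] so heterogeneous matchers can be fused."""
    scores = np.asarray(scores, dtype=float)
    lo, hi = scores.min(), scores.max()
    return (scores - lo) / (hi - lo) if hi > lo else np.zeros_like(scores)

def sum_rule_fusion(scores_a, scores_b, weight_a=0.5):
    """Weighted sum-rule fusion of two matchers' normalized scores."""
    return weight_a * min_max_normalize(scores_a) + (1 - weight_a) * min_max_normalize(scores_b)

# Hypothetical scores from two keystroke matchers (e.g., hold-time and latency based).
scores_hold = np.array([0.82, 0.40, 0.91, 0.35])
scores_latency = np.array([0.70, 0.55, 0.88, 0.20])
print(sum_rule_fusion(scores_hold, scores_latency, weight_a=0.6))
```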

    Keystroke Dynamics Authentication For Collaborative Systems

    We present in this paper a study on the ability and the benefits of using a keystroke dynamics authentication method for collaborative systems. Authentication is a challenging issue for guaranteeing the security of collaborative systems during the access control step. Many solutions exist in the state of the art, such as the use of one-time passwords or smart cards. We focus in this paper on biometric-based solutions that do not require any additional sensor. Keystroke dynamics is an interesting solution as it uses only the keyboard and is transparent to users. Many methods have been published in this field. We make a comparative study of several of them, considering the operational constraints of use for collaborative systems.
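
    As an illustration of the kind of method compared in such studies, the sketch below implements a simple template-based keystroke verifier using a scaled Manhattan distance over timing features. It is a generic example with assumed timing values and an assumed threshold, not one of the specific methods evaluated in the paper.

```python
import numpy as np

def enroll(samples):
    """Build a user template from enrollment timing vectors (one row per typing sample)."""
    samples = np.asarray(samples, dtype=float)
    return samples.mean(axis=0), samples.std(axis=0) + 1e-6  # avoid division by zero

def verify(template, probe, threshold=1.5):
    """Scaled Manhattan distance between a probe and the template; accept if below threshold."""
    mean, std = template
    distance = np.mean(np.abs(np.asarray(probe, dtype=float) - mean) / std)
    return distance, distance < threshold

# Hypothetical hold/latency times (seconds) for a short password.
enrollment = [[0.11, 0.25, 0.13, 0.30],
              [0.10, 0.27, 0.12, 0.28],
              [0.12, 0.24, 0.14, 0.31]]
template = enroll(enrollment)
print(verify(template, [0.11, 0.26, 0.13, 0.29]))   # likely genuine
print(verify(template, [0.25, 0.60, 0.40, 0.10]))   # likely impostor
```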

    Evaluation of Biometric Systems

    Biometrics is considered a promising alternative to traditional authentication methods based on "what we own" (such as a key) or "what we know" (such as a password), since it relies on "what we are" and "how we behave". Few people know that biometrics have been used for ages for identification or signature purposes. In 1928, for example, fingerprints were used for women clerical employees of the Los Angeles police department, as depicted in Figure 1. Fingerprints were also already used as a signature for commercial exchanges in Babylon (around 3000 BC). Alphonse Bertillon proposed in 1879 to use anthropometric information for police investigation. Nowadays, all police forces in the world use this kind of information to solve crimes. The first prototypes of terminals providing automatic processing of voice and fingerprints were developed in the mid-1970s. Nowadays, biometric authentication systems have many applications [1]: border control, e-commerce, etc. The main benefits of this technology are to provide better security and to facilitate the authentication process for a user. Also, it is usually more difficult to copy the biometric characteristics of an individual than most other authentication credentials such as passwords. Despite the obvious advantages of biometric systems, their adoption has not been as widespread as expected. The main drawback is the uncertainty of the verification result. In contrast to password checking, the verification of biometric raw data is subject to errors and is expressed as a similarity percentage (100% is never reached). Other drawbacks related to vulnerabilities and usability issues also exist. In addition, in order to be used in an industrial context, the quality of a biometric system must be precisely quantified. We need a reliable evaluation methodology in order to demonstrate the benefit of a new biometric system. Moreover, many questions remain: Should we trust this technology? What kinds of biometric modalities can be used? What are the trends in this domain? The objective of this chapter is to answer these questions by presenting an evaluation methodology for biometric systems.

    Blind Image Quality Assessment for Face Pose Problem

    No-reference image quality assessment for face images is of high interest since it can be required in biometric systems, such as biometric passport applications, to increase system performance. This can be achieved by controlling the quality of biometric sample images during enrollment. This paper proposes a novel no-reference image quality assessment method that extracts several image features and uses data mining techniques to detect the pose variation problem in facial images. Using subsets from three public 2D face databases (PUT, ENSIB, and AR), the experimental results recorded a promising accuracy of 97.06% when using the RandomForest classifier, which outperforms the other classifiers.
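
    The abstract describes extracting image features and classifying pose with a Random Forest; the sketch below shows a generic scikit-learn pipeline of that shape, with random placeholder features and toy labels standing in for the paper's actual descriptors and databases.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Placeholder features: in the paper, several image features are extracted per face image;
# here random vectors simply stand in for those descriptors.
rng = np.random.default_rng(0)
X = rng.normal(size=(600, 16))            # hypothetical feature vectors
y = rng.integers(0, 2, size=600)          # 1 = frontal pose, 0 = off-pose (toy labels)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("toy accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```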

    Downscaling Using CDAnet Under Observational and Model Noises: The Rayleigh-Benard Convection Paradigm

    Efficient downscaling of large ensembles of coarse-scale information is crucial in several applications, such as oceanic and atmospheric modeling. The determining form map is a theoretical lifting function from the low-resolution solution trajectories of a dissipative dynamical system to their corresponding fine-scale counterparts. Recently, a physics-informed deep neural network ("CDAnet") was introduced, providing a surrogate of the determining form map for efficient downscaling. CDAnet was demonstrated to efficiently downscale noise-free coarse-scale data in a deterministic setting. Herein, the performance of well-trained CDAnet models is analyzed in a stochastic setting involving (i) observational noise, (ii) model noise, and (iii) a combination of observational and model noises. The analysis is performed employing the Rayleigh-Benard convection paradigm, under three training conditions, namely, training with perfect, noisy, or downscaled data. Furthermore, the effects of the noises, the Rayleigh number, and the spatial and temporal resolutions of the input coarse-scale information on the downscaled fields are examined. The results suggest that the expected l2-error of CDAnet behaves quadratically in terms of the standard deviations of the observational and model noises. The results also suggest that CDAnet responds to uncertainties similarly to the theorized and numerically validated CDA behavior, with an additional error overhead due to CDAnet being a surrogate model of the determining form map.
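
    The reported quadratic dependence of the expected l2-error on the noise standard deviations can, in principle, be checked by fitting a quadratic polynomial to (noise level, error) pairs; the sketch below does this on purely synthetic numbers, for illustration only.

```python
import numpy as np

# Hypothetical noise standard deviations and corresponding expected l2-errors
# of the downscaled fields (synthetic numbers, not CDAnet results).
sigma = np.array([0.00, 0.05, 0.10, 0.20, 0.40])
l2_error = 0.02 + 3.0 * sigma**2 + np.random.default_rng(1).normal(0, 1e-3, sigma.size)

# Fit error = a*sigma^2 + b*sigma + c; a dominant quadratic coefficient would support
# the reported quadratic dependence on the noise level.
a, b, c = np.polyfit(sigma, l2_error, deg=2)
print(f"quadratic fit: {a:.3f}*sigma^2 + {b:.3f}*sigma + {c:.3f}")
```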

    Oil Spill Risk Analysis For The NEOM Shoreline

    A risk analysis is conducted considering several release sources located around the NEOM shoreline. The sources are selected close to the coast and in neighboring regions of high marine traffic. The evolution of oil spills released by these sources is simulated using the MOHID model, driven by validated, high-resolution met-ocean fields of the Red Sea. For each source, simulations are conducted over a 4-week period, starting from the first, tenth, and twentieth days of each month, covering five consecutive years. A total of 48 simulations are thus conducted for each source location, adequately reflecting the variability of met-ocean conditions in the region. The risk associated with each source is described in terms of the amount of oil beached and the elapsed time required for the spilled oil to reach the NEOM coast, which extends from the Gulf of Aqaba in the north to Duba in the south. A finer analysis is performed by segmenting the NEOM shoreline based on important coastal development and installation sites. For each subregion, source, and release event considered, a histogram of the volume of oil beached is generated, also classifying individual events in terms of the corresponding arrival times. In addition, for each subregion considered, an inverse analysis is conducted to identify regions of dependence of the cumulative risk, estimated using the collection of all sources and events considered. The transport of oil around the NEOM shorelines is promoted by chaotic circulations and northwest winds in summer, and by a dominant cyclonic eddy in winter. Hence, spills originating from release sources located close to the NEOM shorelines are characterized by large monthly variations in arrival times, ranging from less than a week to more than two weeks. Large variations in the volume fraction of beached oil, ranging from less than 50% to more than 80%, are reported. Comment: 15 pages, 8 figures
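
    The per-subregion histograms of beached volume and the arrival-time classification described above can be summarized with a few lines of post-processing; the sketch below uses hypothetical per-event outcomes in place of actual MOHID simulation results.

```python
import numpy as np

# Hypothetical per-event outcomes for one shoreline segment and one release source:
# beached volume fraction and arrival time (days) for each of the 48 simulated events.
rng = np.random.default_rng(2)
beached_fraction = rng.uniform(0.4, 0.9, 48)
arrival_days = rng.uniform(3, 20, 48)

# Histogram of the beached volume fraction, summarizing the risk for this subregion.
counts, edges = np.histogram(beached_fraction, bins=np.linspace(0, 1, 11))
for lo, hi, n in zip(edges[:-1], edges[1:], counts):
    print(f"{lo:.1f}-{hi:.1f}: {n} events")

# Classify events by arrival time: under one week, one to two weeks, over two weeks.
bins = [0, 7, 14, np.inf]
labels = ["< 1 week", "1-2 weeks", "> 2 weeks"]
for label, lo, hi in zip(labels, bins[:-1], bins[1:]):
    n = int(np.sum((arrival_days >= lo) & (arrival_days < hi)))
    print(f"{label}: {n} events")
```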

    Quality of Care Assessment and Adherence to the International Guidelines Considering Dialysis, Water Treatment and Protection Against Transmission of Infections in University Hospital Based Dialysis Unit in Cairo, Egypt

    Background: End-stage renal disease (ESRD) has emerged as a major public health problem around the world. In recent decades, several important advances have been made in hemodialysis (HD) therapy, with the introduction of international guidelines to ensure the delivery of optimum care to HD patients. An increased mortality risk in HD patients unable to meet six targets in different areas of HD practice has been reported by the DOPPS investigators.

    Variance-based sensitivity analysis of oil spill predictions in the Red Sea region

    To support rapid response efforts after an accidental spill, oil spill simulations may generally need to account for uncertainties concerning the nature and properties of the spill, which compound those inherent in model parameterizations. A full, detailed account of these sources of uncertainty would, however, require the prohibitive resources needed to sample a large-dimensional space. In this work, a variance-based sensitivity analysis is conducted to explore the possibility of restricting a priori the set of uncertain parameters, at least in the context of realistic simulations of oil spills in the Red Sea region spanning a two-week period following the oil release. The evolution of the spill is described using the simulation capabilities of the MOHID (Modelo Hidrodinâmico) model, driven by high-resolution met-ocean fields of the Red Sea. Eight spill scenarios are considered in the analysis, carefully selected to account for the diversity of met-ocean conditions in the region. Polynomial chaos expansions are employed to propagate parametric uncertainties and efficiently estimate variance-based sensitivities. Attention is focused on integral quantities characterizing the transport, deformation, evaporation, and dispersion of the spill. The analysis indicates that variability in these quantities may be suitably captured by restricting the set of uncertain input parameters, namely the wind coefficient, interfacial tension, API gravity, and viscosity. Thus, forecast variability and confidence intervals may be reasonably estimated in the corresponding four-dimensional input space.
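
    The paper estimates variance-based sensitivities with polynomial chaos expansions; as a simpler illustration of the same quantities, the sketch below estimates first-order Sobol indices with a plain Monte Carlo pick-freeze estimator applied to a hypothetical four-input toy surrogate (wind coefficient, interfacial tension, API gravity, viscosity), not the actual spill model.

```python
import numpy as np

def first_order_sobol(model, sampler, n=20000, dim=4, seed=0):
    """Estimate first-order Sobol indices with the Saltelli-style pick-freeze estimator.

    model  : callable mapping an (n, dim) array of inputs to n scalar outputs
    sampler: callable returning an (n, dim) array of independent input samples
    """
    rng = np.random.default_rng(seed)
    A, B = sampler(n, rng), sampler(n, rng)
    yA, yB = model(A), model(B)
    var_y = np.concatenate([yA, yB]).var()
    indices = []
    for i in range(dim):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                                  # replace only the i-th input
        indices.append(np.mean(yB * (model(ABi) - yA)) / var_y)
    return np.array(indices)

# Hypothetical surrogate of an integral spill quantity as a function of
# (wind coefficient, interfacial tension, API gravity, viscosity), each scaled to [0, 1].
def toy_model(x):
    return 3.0 * x[:, 0] + 0.5 * x[:, 1] + 1.5 * x[:, 2] ** 2 + 0.2 * x[:, 3]

sampler = lambda n, rng: rng.uniform(0.0, 1.0, size=(n, 4))
print(first_order_sobol(toy_model, sampler))
```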

    Towards an end-to-end analysis and prediction system for weather, climate, and marine applications in the Red Sea

    Author Posting. © American Meteorological Society, 2021. This article is posted here by permission of the American Meteorological Society for personal use, not for redistribution. The definitive version was published in Bulletin of the American Meteorological Society 102(1), (2021): E99-E122, https://doi.org/10.1175/BAMS-D-19-0005.1. The Red Sea, home to the second-longest coral reef system in the world, is a vital resource for the Kingdom of Saudi Arabia. The Red Sea provides 90% of the Kingdom's potable water by desalinization, supporting tourism, shipping, aquaculture, and fishing industries, which together contribute about 10%–20% of the country's GDP. All these activities, and those elsewhere in the Red Sea region, critically depend on oceanic and atmospheric conditions. At a time of mega-development projects along the Red Sea coast, and of global warming, authorities are working on optimizing the harnessing of environmental resources, including renewable energy and rainwater harvesting. All of this requires high-resolution weather and climate information. Toward this end, we have undertaken a multipronged research and development activity in which we are developing an integrated, data-driven regional coupled modeling system. The telescopically nested components include 5-km- to 600-m-resolution atmospheric models to address weather and climate challenges, 4-km- to 50-m-resolution ocean models with regional and coastal configurations to simulate and predict the general and mesoscale circulation, 4-km- to 100-m-resolution ecosystem models to simulate the biogeochemistry, and 1-km- to 50-m-resolution wave models. In addition, a complementary probabilistic transport modeling system predicts the dispersion of contaminant plumes, oil spills, and marine ecosystem connectivity. Advanced ensemble data assimilation capabilities have also been implemented for accurate forecasting. Resulting achievements include significant advancement in our understanding of the regional circulation and its connection to the global climate, and the development and validation of long-term Red Sea regional atmospheric–oceanic–wave reanalyses and forecasting capacities. These products are being extensively used by academia, government, and industry in various weather and marine studies and operations, environmental policies, renewable energy applications, impact assessment, flood forecasting, and more. The development of the Red Sea modeling system is being supported by the Virtual Red Sea Initiative and the Competitive Research Grants (CRG) program from the Office of Sponsored Research at KAUST, by Saudi Aramco Company through the Saudi ARAMCO Marine Environmental Center at KAUST, and by funds from KAEC, NEOM, and RSP through Beacon Development Company at KAUST.
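
    The abstract mentions ensemble data assimilation for accurate forecasting; as a generic illustration of that idea (not the system's actual implementation), the sketch below performs one stochastic Ensemble Kalman Filter analysis step on a toy state vector with an assumed linear observation operator.

```python
import numpy as np

def enkf_analysis(ensemble, obs, H, obs_err_std, rng):
    """One stochastic Ensemble Kalman Filter analysis step.

    ensemble    : (n_state, n_members) forecast ensemble
    obs         : (n_obs,) observation vector
    H           : (n_obs, n_state) linear observation operator
    obs_err_std : observation error standard deviation
    """
    n_members = ensemble.shape[1]
    Xp = ensemble - ensemble.mean(axis=1, keepdims=True)     # state perturbations
    HX = H @ ensemble
    HXp = HX - HX.mean(axis=1, keepdims=True)
    R = obs_err_std**2 * np.eye(len(obs))
    Pxy = Xp @ HXp.T / (n_members - 1)                       # state-observation covariance
    Pyy = HXp @ HXp.T / (n_members - 1) + R                  # innovation covariance
    K = Pxy @ np.linalg.inv(Pyy)                             # Kalman gain
    perturbed_obs = obs[:, None] + rng.normal(0, obs_err_std, (len(obs), n_members))
    return ensemble + K @ (perturbed_obs - HX)

# Toy example: a 10-variable state observed at 3 locations by a 20-member ensemble.
rng = np.random.default_rng(3)
ens = rng.normal(0.0, 1.0, (10, 20))
H = np.zeros((3, 10)); H[0, 1] = H[1, 4] = H[2, 8] = 1.0
print(enkf_analysis(ens, np.array([0.5, -0.2, 1.0]), H, 0.1, rng).shape)
```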